# Small-scale pretraining
The models below are small-scale pretrained checkpoints, listed with their author, license, task category, and download/like counts.

| Model | Author | License | Description | Category | Downloads | Likes |
|---|---|---|---|---|---|---|
| Llama 3.2 400M Amharic | rasyosef | — | A streamlined version of Meta's Llama-3.2-1B, pretrained for Amharic with 400 million parameters and a 1024-token context length. | Large Language Model (Transformers) | 310 | 3 |
| Vit Betwixt Patch32 Clip 224.tinyclip Laion400m | timm | MIT | A small ViT-based CLIP model trained on the LAION-400M dataset, suitable for zero-shot image classification. | Image Classification | 113 | 1 |
| GPT NeoX 1.3B Viet Final GGUF | afrideva | — | A 1.3B-parameter GPT-NeoX model pretrained on 31.3 GB of Vietnamese data. | Large Language Model (English) | 170 | 1 |
| Roberta Base 10M 1 | nyu-mll | — | Part of a RoBERTa series pretrained on datasets of varying scale (1M–1B tokens), in BASE and MED-SMALL configurations. | Large Language Model | 13 | 1 |
| It5 Small | gsarti | Apache-2.0 | IT5 is the first family of sequence-to-sequence Transformer models pretrained at scale for Italian, following the approach of the original T5 model. | Large Language Model | 220 | 2 |
| Roberta Base 100M 1 | nyu-mll | — | A RoBERTa base model pretrained on 1B tokens with a validation perplexity of 3.93, suitable for English text-processing tasks. | Large Language Model | 63 | 0 |
| Roformer Chinese Char Small | junnyu | — | RoFormer is a Chinese Transformer model enhanced with rotary position embeddings, suitable for text-infilling tasks. | Large Language Model (Chinese) | 24 | 0 |
| Gpt2 Small Indonesian 522M | cahya | MIT | A GPT2-small model pretrained on Indonesian Wikipedia, specialized for Indonesian text generation. | Large Language Model | 1,900 | 9 |
| Kinyaroberta Small | jean-paul | — | A RoBERTa model pretrained on Kinyarwanda data with a masked language modeling (MLM) objective and case-insensitive tokenization. | Large Language Model (Transformers) | 38 | 0 |
| Roberta Med Small 1M 1 | nyu-mll | — | A RoBERTa model in the MED-SMALL configuration pretrained on 1M tokens, suitable for text-understanding tasks. | Large Language Model | 23 | 1 |